Search results for "information theory"
Showing 10 of 51 documents
Extropy: Complementary Dual of Entropy
2015
This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
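The truncated abstract does not reproduce the formula, but the extropy of a mass function $p = (p_1, \dots, p_n)$ in this literature is $J(p) = -\sum_i (1 - p_i)\log(1 - p_i)$, the complementary dual of Shannon's $H(p) = -\sum_i p_i \log p_i$. A minimal numerical sketch of the bifurcation described above (function names and the choice of natural logarithm are illustrative):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log, 0 log 0 = 0)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the dual measure."""
    p = np.asarray(p, dtype=float)
    q = 1.0 - p
    nz = q > 0
    return -np.sum(q[nz] * np.log(q[nz]))

# Binary distribution: entropy and extropy are identical.
b = [0.3, 0.7]
print(entropy(b), extropy(b))   # same value twice

# Ternary distribution: the measures bifurcate.
t = [0.5, 0.3, 0.2]
print(entropy(t), extropy(t))   # distinct values
```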
Shallow Reductionism and the Problem of Complexity in Psychology
2008
In his recent book The Mind Doesn't Work That Way, Fodor argues that computational modeling of global cognitive processes, such as abductive everyday reasoning, has not been successful. In this article the problem is analyzed in the framework of algorithmic information theory. It is argued that the failed approaches are characterized by shallow reductionism, which is rejected in favor of deep reductionism and nonreductionism.
Quantifying, characterizing, and controlling information flow in ultracold atomic gases
2011
We study quantum information flow in a model comprising an impurity qubit immersed in a Bose-Einstein-condensed reservoir. We demonstrate how information flux between the qubit and the condensate can be manipulated by engineering the ultracold reservoir within experimentally realistic limits. We place particular emphasis on non-Markovian dynamics, characterized by a reversed flow of information from the background gas to the qubit, and identify a controllable crossover between Markovian and non-Markovian dynamics in the parameter space of the model.
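The abstract does not name its quantifier, but a standard way to operationalize "reversed flow of information" is the Breuer-Laine-Piilo criterion: information flows back to the system whenever the trace distance between two evolving states increases. A toy dephasing-qubit sketch of that criterion (the oscillating decoherence factor below is purely illustrative, not the paper's condensate model):

```python
import numpy as np

def trace_distance(r1, r2):
    """D = (1/2) tr|r1 - r2| for 2x2 density matrices."""
    ev = np.linalg.eigvalsh(r1 - r2)
    return 0.5 * np.sum(np.abs(ev))

def dephased(rho0, f):
    """Pure-dephasing map: off-diagonal elements damped by factor f."""
    rho = rho0.copy()
    rho[0, 1] *= f
    rho[1, 0] *= f
    return rho

plus  = 0.5 * np.array([[1,  1], [ 1, 1]], dtype=complex)
minus = 0.5 * np.array([[1, -1], [-1, 1]], dtype=complex)

# Toy decoherence factor with partial revivals (non-Markovian backflow).
ts = np.linspace(0, 10, 500)
fs = np.exp(-0.3 * ts) * np.abs(np.cos(ts))

D = [trace_distance(dephased(plus, f), dephased(minus, f)) for f in fs]
# Time steps where D(t) increases signal information flowing back to the qubit.
backflow = np.where(np.diff(D) > 0)[0]
print(f"backflow on {len(backflow)} of {len(ts) - 1} time steps")
```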
Multiscale Information Storage of Linear Long-Range Correlated Stochastic Processes
2019
Information storage, reflecting the capability of a dynamical system to keep predictable information during its evolution over time, is a key element of intrinsic distributed computation, useful for the description of the dynamical complexity of several physical and biological processes. Here we introduce a parametric approach which allows one to compute information storage across multiple timescales in stochastic processes displaying both short-term dynamics and long-range correlations (LRC). Our analysis is performed in the popular framework of multiscale entropy, whereby a time series is first "coarse grained" at the chosen timescale through low-pass filtering and downsampling, and then …
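A minimal sketch of the coarse-graining step described above, using the classical non-overlapping-averaging variant (a moving-average low-pass filter followed by downsampling by the timescale factor; the paper's parametric approach refines this, so treat the code as illustrative):

```python
import numpy as np

def coarse_grain(x, tau):
    """Coarse-grain series x at timescale tau: average non-overlapping
    windows of length tau (moving-average low-pass + downsampling)."""
    n = len(x) // tau
    return np.asarray(x[:n * tau]).reshape(n, tau).mean(axis=1)

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
for tau in (1, 2, 5, 10):
    y = coarse_grain(x, tau)
    print(tau, len(y), y.var())  # variance shrinks ~1/tau for white noise
```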
Entanglement generation between two spin-s magnetic impurities in a solid via electron scattering
2009
We present a scheme for generating entanglement between two magnetic impurities in a solid-state system via electron scattering. The scheme applies to impurities of arbitrary quantum spin number. We show that resonance conditions yield generation of a maximally entangled state of the impurities' spins, regardless of the value of the electron–impurity coupling constant and the impurity spin quantum number. The mechanism behind the scheme is explained in terms of resonance-induced selection rules.
Algorithmic Information Theory and Computational Complexity
2013
We present examples where theorems on the complexity of computation are proved using methods from algorithmic information theory. The first example is a non-effective construction of a language for which the size of any deterministic finite automaton exponentially exceeds the size of a probabilistic finite automaton with bounded error. The second example concerns frequency computation. Frequency computation was introduced by Rose and McNaughton in the early sixties and developed by Trakhtenbrot, Kinber, Degtev, Wechsung, Hinrichs and others. A transducer is a finite-state automaton with an input and an output. We consider the possibilities of probabilistic and frequency transducers and prove sever…
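To make the transducer definition concrete, here is a minimal deterministic transducer (Mealy machine) sketch; the probabilistic and frequency transducers the abstract considers generalize this, and the toy machine below is purely illustrative:

```python
def run_transducer(transitions, start, word):
    """Run a deterministic finite-state transducer (Mealy machine).
    transitions: dict mapping (state, input_symbol) -> (next_state, output_symbol).
    """
    state, out = start, []
    for sym in word:
        state, o = transitions[(state, sym)]
        out.append(o)
    return "".join(out)

# Toy transducer over {0,1}: outputs 1 exactly when the current bit
# differs from the previous one (an edge detector; q0/q1 remember the last bit).
T = {
    ("q0", "0"): ("q0", "0"), ("q0", "1"): ("q1", "1"),
    ("q1", "0"): ("q0", "1"), ("q1", "1"): ("q1", "0"),
}
print(run_transducer(T, "q0", "0011010"))  # -> 0010111
```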
Combinatorial proofs of two theorems of Lutz and Stull
2021
Recently, Lutz and Stull used methods from algorithmic information theory to prove two new Marstrand-type projection theorems, concerning subsets of Euclidean space which are not assumed to be Borel, or even analytic. One of the theorems states that if $K \subset \mathbb{R}^{n}$ is any set with equal Hausdorff and packing dimensions, then $$ \dim_{\mathrm{H}} \pi_{e}(K) = \min\{\dim_{\mathrm{H}} K, 1\} $$ for almost every $e \in S^{n-1}$. Here $\pi_{e}$ stands for orthogonal projection to $\mathrm{span}(e)$. The primary purpose of this paper is to present proofs for Lutz and Stull's projection theorems which do not refer to information theoretic concepts. Instead, they will rely on combinatori…
Decoding algorithm for HL-codes and performance of the DHH-cryptosystem -- a candidate for post-quantum cryptography
2023
We give a decoding algorithm for a class of error-correcting codes that can be used in the DHH-cryptosystem, a McEliece-type candidate for post-quantum cryptography. Furthermore, we implement the encryption and decryption algorithms for this cryptosystem and investigate its performance.
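The abstract gives no details of HL-codes, so the sketch below only illustrates the generic McEliece-type flow the entry refers to, with the Hamming [7,4] code standing in for the secret code; every concrete choice here is a placeholder, not the DHH construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in secret code: Hamming [7,4], systematic, corrects one error.
G = np.array([[1,0,0,0,0,1,1],
              [0,1,0,0,1,0,1],
              [0,0,1,0,1,1,0],
              [0,0,0,1,1,1,1]])
H = np.array([[0,1,1,1,1,0,0],
              [1,0,1,1,0,1,0],
              [1,1,0,1,0,0,1]])

# Key generation: scramble rows with invertible S, permute columns with P.
S    = np.array([[1,1,0,0],[0,1,1,0],[0,0,1,1],[0,0,0,1]])   # fixed for brevity
Sinv = np.array([[1,1,1,1],[0,1,1,1],[0,0,1,1],[0,0,0,1]])   # S @ Sinv = I mod 2
perm = rng.permutation(7)
G_pub = (S @ G % 2)[:, perm]           # public key hides the code structure

# Encryption: codeword for message m plus a random weight-1 error e.
m = np.array([1, 0, 1, 1])
e = np.zeros(7, dtype=int); e[rng.integers(7)] = 1
c = (m @ G_pub + e) % 2

# Decryption: undo the permutation, syndrome-decode the error, undo S.
c_unperm = np.empty(7, dtype=int); c_unperm[perm] = c
s = H @ c_unperm % 2
if s.any():                            # nonzero syndrome locates the error
    j = next(k for k in range(7) if (H[:, k] == s).all())
    c_unperm[j] ^= 1
m_rec = (c_unperm[:4] @ Sinv) % 2      # G is systematic, so strip S
assert (m_rec == m).all()
print(m_rec)
```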
End-to-end Optimized Image Compression
2016
We describe an image compression method, consisting of a nonlinear analysis transformation, a uniform quantizer, and a nonlinear synthesis transformation. The transforms are constructed in three successive stages of convolutional linear filters and nonlinear activation functions. Unlike most convolutional neural networks, the joint nonlinearity is chosen to implement a form of local gain control, inspired by those used to model biological neurons. Using a variant of stochastic gradient descent, we jointly optimize the entire model for rate-distortion performance over a database of training images, introducing a continuous proxy for the discontinuous loss function arising from the quantizer.…
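A common continuous proxy in this line of work replaces the test-time rounding quantizer with additive i.i.d. uniform noise during training, which matches the quantizer's marginal error statistics while keeping the loss differentiable. A minimal sketch of that substitution (the truncated abstract does not quote the exact proxy, so treat this as illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(y):
    """Test-time quantizer: round to nearest integer. Its gradient is
    zero almost everywhere, so it cannot be trained through directly."""
    return np.round(y)

def noisy_quantize(y):
    """Training-time proxy: additive i.i.d. uniform noise on (-1/2, 1/2),
    a continuous stand-in for the rounding error."""
    return y + rng.uniform(-0.5, 0.5, size=y.shape)

y = rng.normal(0, 3, size=100_000)
print(np.var(quantize(y) - y), np.var(noisy_quantize(y) - y))  # both ~ 1/12
```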
Nash codes for noisy channels
2012
This paper studies the stability of communication protocols that deal with transmission errors. We consider a coordination game between an informed sender and an uninformed decision maker, the receiver, who communicate over a noisy channel. The sender's strategy, called a code, maps states of nature to signals. The receiver's best response is to decode the received channel output as the state with highest expected receiver payoff. Given this decoding, an equilibrium or "Nash code" results if the sender encodes every state as prescribed. We show two theorems that give sufficient conditions for Nash codes. First, a receiver-optimal code defines a Nash code. A second, more surprising observati…
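A toy version of the receiver's decoding rule and the resulting Nash-code check, assuming a binary state space, a binary symmetric channel, and common coordination payoffs (all illustrative simplifications of the paper's setting):

```python
import numpy as np

prior = np.array([0.6, 0.4])            # Pr(state)
u = np.array([[1.0, 0.0],               # common payoff u[state, action]
              [0.0, 1.0]])
eps = 0.1                               # channel error rate
P = np.array([[1 - eps, eps],           # P[x, y] = Pr(receive y | send x)
              [eps, 1 - eps]])
code = [0, 1]                           # candidate code: state s -> signal code[s]

# Receiver best response: decode output y as the action maximizing
# expected payoff under the posterior induced by the code and the channel.
def decode(y):
    scores = [sum(prior[s] * P[code[s], y] * u[s, a] for s in range(2))
              for a in range(2)]
    return int(np.argmax(scores))

# Sender check: given this decoding, is the prescribed signal optimal
# in every state? If so, the code is a Nash code.
def is_nash_code():
    for s in range(2):
        payoff = [sum(P[x, y] * u[s, decode(y)] for y in range(2))
                  for x in range(2)]
        if payoff[code[s]] < max(payoff):
            return False
    return True

print(is_nash_code())   # True: this receiver-optimal code is a Nash code
```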